    A Concurrent Spectral-Screening PCT Algorithm For Remote Sensing Applications

    The paper presents a concurrent algorithm for remote sensing applications that provides significant performance and image quality enhancements over conventional uniprocessor PCT techniques. The algorithm combines spectral angle classification, the principal component transform, and human-centered color mapping. It is evaluated from an image quality perspective using images collected with the Hyper-spectral Digital Imagery Collection Experiment (HYDICE) sensor, an airborne imaging spectrometer. These images correspond to foliated scenes taken from altitudes of 2,000 to 7,500 meters at wavelengths between 400 nm and 2.5 µm. The scenes contain mechanized vehicles sitting in open fields as well as under camouflage. The algorithm operates with close to linear speedup on shared-memory multiprocessors and can be readily extended to operate on multiple low-cost PC-style servers connected with high-performance networking. A simple analytical model is outlined that allows the impact of practical, application-specific properties on performance to be assessed. These properties include image resolution, the number of spectral bands, the number of processors, processor technology, networking speed, and system clock rate.
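    To make the screening step concrete, here is a minimal Python/NumPy sketch of spectral-angle screening, assuming the image is held in memory as a (rows, cols, bands) array; the function names and the threshold value are illustrative, not taken from the paper.

```python
# Minimal sketch of spectral-angle screening (illustrative names/threshold).
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle in radians between two spectra; smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def screen(cube: np.ndarray, threshold: float) -> np.ndarray:
    """Keep a spectrum only if it differs from every retained spectrum
    by more than the threshold angle, yielding a representative subset."""
    pixels = cube.reshape(-1, cube.shape[-1])
    kept = []
    for spectrum in pixels:
        if all(spectral_angle(spectrum, k) > threshold for k in kept):
            kept.append(spectrum)
    return np.asarray(kept)

if __name__ == "__main__":
    cube = np.random.rand(16, 16, 210)   # toy cube; HYDICE provides 210 bands
    subset = screen(cube, threshold=0.2)
    print(f"screened {cube.shape[0] * cube.shape[1]} spectra down to {len(subset)}")
```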

    Resilient Image Fusion

    The paper describes a distributed spectral-screening PCT algorithm for fusing hyper-spectral images in remote sensing applications. The algorithm provides intrusion tolerance against information warfare attacks using the notion of computational resiliency. This concept uses replication to achieve fault tolerance, but goes further by dynamically regenerating replication in response to an attack or failure. Resiliency is incorporated through application-independent library technology that hides the details of the communication protocols required to achieve dynamic replication and reconfiguration in distributed applications. The paper provides a status report on our progress in developing the concept and applying it to image fusion. In particular, we examine the performance of the PCT algorithm and compare the results with and without resiliency to assess the associated overhead.
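    The replicate-and-regenerate idea can be pictured with a toy Python sketch. The version below treats an exception in a replica as a failure and resubmits a replacement to restore the replication level; the names are hypothetical, and the paper's actual library handles real process loss and reconfiguration across machines.

```python
# Toy sketch of computational resiliency via replication (hypothetical names).
from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

def fuse_block(block_id: int) -> int:
    """Stand-in for one unit of the image-fusion computation."""
    return block_id * block_id

def resilient_run(block_id: int, replicas: int = 3) -> int:
    """Run several replicas; regenerate a replacement whenever one fails."""
    with ProcessPoolExecutor(max_workers=replicas) as pool:
        pending = {pool.submit(fuse_block, block_id) for _ in range(replicas)}
        while pending:
            done, pending = wait(pending, return_when=FIRST_COMPLETED)
            for future in done:
                if future.exception() is None:
                    return future.result()      # first healthy replica wins
                # A replica failed: resubmit to restore the replication level.
                pending.add(pool.submit(fuse_block, block_id))
    raise RuntimeError("all replicas lost")

if __name__ == "__main__":
    print(resilient_run(7))   # -> 49
```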

    MathWeb: A Concurrent Image Analysis Tool Suite for Multi-spectral Data Fusion

    This paper describes a preliminary approach to the fusion of multi-spectral image data for the analysis of cervical cancer. The long-term goal of this research is to define spectral signatures and automatically detect cancer cell structures. The approach combines a multi-spectral microscope with an image analysis tool suite, MathWeb. The tool suite incorporates a concurrent Principal Component Transform (PCT) that is used to fuse the multi-spectral data. This paper describes the general approach and the concurrent PCT algorithm. The algorithm is evaluated from the perspectives of both image quality and performance scalability.
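    As an illustration of the PCT fusion step itself, here is a minimal NumPy sketch that projects each pixel's spectrum onto the top three principal components and contrast-stretches them into RGB; the simple stretch stands in for the human-centered color mapping described in these papers, which is not reproduced here.

```python
# Minimal sketch of PCT fusion to an RGB composite (illustrative mapping).
import numpy as np

def pct_fuse(cube: np.ndarray) -> np.ndarray:
    """Project each pixel's spectrum onto the top 3 principal components."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    X -= X.mean(axis=0)                      # center each band
    cov = np.cov(X, rowvar=False)            # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    top3 = eigvecs[:, -3:][:, ::-1]          # three largest components
    Y = X @ top3                             # project the spectra
    # Contrast-stretch each component into [0, 1] for display as RGB.
    lo, hi = Y.min(axis=0), Y.max(axis=0)
    rgb = (Y - lo) / (hi - lo + 1e-12)
    return rgb.reshape(rows, cols, 3)

if __name__ == "__main__":
    composite = pct_fuse(np.random.rand(64, 64, 31))  # e.g. 31 spectral bands
    print(composite.shape, composite.min(), composite.max())
```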

    Concurrent algorithms and performance modeling for multi-spectral image fusion applications

    The thesis presents a collection of novel concurrent algorithms and their associated analytical models. The algorithms combine spectral angle classification, the principal component transform, and human-centered color mapping. They fuse a multi- or hyper-spectral image set into a single color composite image that maximizes the impact of spectral variation on the human visual system. To demonstrate the utility of the algorithms, they are evaluated from an image quality perspective using images collected from the HYDICE sensor, a multi-spectral microscope, and image streams that emanate from a real-time multi-spectral camera. The algorithms are supported with predictive analytical models that allow performance to be assessed for a wide variety of typical variations in use: for example, changes to the number of spectra, image resolution, processor speed, memory size, network bandwidth/latency, and granularity of decomposition. The motivation for building performance models is to assess the impact of the changes in technology and problem size associated with different applications, allowing cost-performance tradeoffs to be evaluated.
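    The following toy Python function illustrates the flavor of such a model, assuming covariance accumulation dominates computation and that each processor exchanges one partial covariance matrix; the cost terms and constants are illustrative placeholders, not the thesis's calibrated model.

```python
# Toy analytical model of data-parallel PCT fusion time (placeholder costs).
def predicted_time(pixels: int, bands: int, procs: int,
                   flops_per_sec: float, bandwidth_bytes: float,
                   latency_s: float) -> float:
    """Rough execution-time estimate: per-processor compute + reduction cost."""
    work = pixels * bands * bands          # covariance accumulation dominates
    compute = work / (procs * flops_per_sec)
    # Each processor contributes a bands x bands partial covariance (8-byte floats).
    comm = latency_s + (bands * bands * 8 * procs) / bandwidth_bytes
    return compute + comm

if __name__ == "__main__":
    base = predicted_time(1024 * 1024, 210, 1, 1e9, 1e8, 1e-4)
    for p in (2, 4, 8, 16):
        t = predicted_time(1024 * 1024, 210, p, 1e9, 1e8, 1e-4)
        print(f"{p:2d} processors: predicted speedup {base / t:.2f}")
```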

    A Distributed Spectral-Screening PCT Algorithm

    This paper describes a novel distributed algorithm for use in remote-sensing, medical image analysis, and surveillance applications. The algorithm combines spectral-screening classification with the principal component transform (PCT) and human-centered color mapping. It fuses a multi- or hyper-spectral image set into a single color composite image that maximizes the impact of spectral variation on the human visual system. The algorithm operates on distributed collections of shared-memory multiprocessors that are connected through high-performance networking. Scenes taken from a standard 210-frame remote-sensing data set, collected with the Hyper-spectral Digital Imagery Collection Experiment (HYDICE) airborne imaging spectrometer, are used to assess the algorithm's image quality, performance, and scaling. The algorithm is supported with a predictive analytical model that allows its performance to be assessed for a wide variety of typical variations in use: for example, changes to the number of spectra, image resolution, processor speed, memory size, network bandwidth/latency, and granularity of decomposition.
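    A minimal sketch of the distributed step is shown below: image rows are partitioned across workers, each computes partial covariance statistics, and the root reduces them into the global covariance. Local processes stand in for the networked servers, and the function names are illustrative.

```python
# Minimal sketch of a distributed partial-covariance reduction (illustrative).
import numpy as np
from multiprocessing import Pool

def partial_stats(chunk: np.ndarray):
    """Per-worker sufficient statistics for the global covariance."""
    X = chunk.reshape(-1, chunk.shape[-1]).astype(np.float64)
    return X.T @ X, X.sum(axis=0), X.shape[0]

def distributed_covariance(cube: np.ndarray, workers: int = 4) -> np.ndarray:
    chunks = np.array_split(cube, workers, axis=0)   # partition by image rows
    with Pool(workers) as pool:
        results = pool.map(partial_stats, chunks)
    # Root reduction: combine the partial sums into one covariance matrix.
    xtx = sum(r[0] for r in results)
    s = sum(r[1] for r in results)
    n = sum(r[2] for r in results)
    mean = s / n
    return xtx / n - np.outer(mean, mean)

if __name__ == "__main__":
    cov = distributed_covariance(np.random.rand(128, 128, 64))
    print(cov.shape)   # -> (64, 64)
```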

    PHH: Policy-Based Hyper-Heuristic With Reinforcement Learning

    Hyper-heuristics have a high level of generality and adaptability, allowing them to solve a wide range of complex optimization problems effectively. With reinforcement learning, hyper-heuristics can use the experience and knowledge they gain to tackle unforeseen problems, allowing them to adapt and improve over time. Our paper proposes a framework that uses policy-based reinforcement learning to improve the performance of hyper-heuristics. The framework trains hyper-heuristic agents to select the best generalized constructive low-level heuristics for solving combinatorial optimization problems. The framework was evaluated on three benchmark problems: the traveling salesman, capacitated vehicle routing, and bin packing problems. The results showed that the proposed framework can outperform existing meta-heuristic and hyper-heuristic-based algorithms on all large problem instances in all problem domains. The framework was also evaluated by applying it to a cost optimization problem for workflow scheduling on a hybrid cloud with a deadline constraint. Eight agents were trained on medium-sized workflows with two deadlines and tested against traditional meta-heuristic and hyper-heuristic methods on smaller and larger workflows with unforeseen deadlines. Four workflow applications, three workflow sizes, and three deadlines were used in the evaluation. The results showed that the proposed framework delivered solutions that were better by up to 98% on the benchmark problems and by up to 22% for cost optimization in workflow scheduling. Moreover, although trained on small problem instances, the framework performed well on unforeseen larger instances, indicating that it generalizes. The proposed framework thus has the potential to improve both the generality and the performance of methods for solving large combinatorial optimization problems.
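    To illustrate the policy-based selection idea, below is a toy REINFORCE-style sketch in Python that learns a preference over three classic constructive heuristics for one-dimensional bin packing; the state, reward, and update design are illustrative and far simpler than the paper's framework.

```python
# Toy policy-based selection of constructive low-level heuristics (illustrative).
import math
import random

HEURISTICS = ["first_fit", "best_fit", "worst_fit"]

def place(item, bins, capacity, rule):
    """Apply one low-level heuristic to place a single item."""
    feasible = [i for i, load in enumerate(bins) if load + item <= capacity]
    if not feasible:
        bins.append(item)                             # open a new bin
        return
    if rule == "first_fit":
        idx = feasible[0]
    elif rule == "best_fit":
        idx = max(feasible, key=lambda i: bins[i])    # tightest remaining space
    else:                                             # worst_fit
        idx = min(feasible, key=lambda i: bins[i])
    bins[idx] += item

def softmax(prefs):
    exps = [math.exp(p) for p in prefs]
    total = sum(exps)
    return [e / total for e in exps]

def train(episodes=500, lr=0.1, capacity=1.0, n_items=50):
    prefs = [0.0] * len(HEURISTICS)                   # one preference per heuristic
    for _ in range(episodes):
        items = [random.uniform(0.1, 0.7) for _ in range(n_items)]
        probs = softmax(prefs)
        choice = random.choices(range(len(HEURISTICS)), probs)[0]
        bins = []
        for item in items:
            place(item, bins, capacity, HEURISTICS[choice])
        reward = -len(bins)                           # fewer bins is better
        baseline = -math.ceil(sum(items) / capacity)  # lower bound on bin count
        # REINFORCE: shift preferences by the advantage of the chosen heuristic.
        for a in range(len(HEURISTICS)):
            grad = (1.0 if a == choice else 0.0) - probs[a]
            prefs[a] += lr * (reward - baseline) * grad
    return prefs

if __name__ == "__main__":
    for name, p in zip(HEURISTICS, softmax(train())):
        print(f"{name}: {p:.2f}")
```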